Method, control unit and system for avoiding collision with vulnerable road users
Patent abstract:
SUMMARY

Method (600) and control unit (310) for avoiding a potential collision between the vehicle (100) and a Vulnerable Road User, VRU (200). The method (600) comprises: predicting (601) a future path (t1, t2, t3) of the vehicle (100); detecting (602) the VRU (200) and the position of the VRU (200); determining (603) the velocity of the detected (602) VRU (200); predicting (604) a future position (210) of the detected (602) VRU (200), based on the VRU position upon detection (602) and the determined (603) VRU velocity; and performing (607) an action for avoiding a collision when the predicted (604) future position (210) of the VRU (200) overlaps (220) the predicted (601) future path (t1, t2, t3) of the vehicle (100).

Publication number: SE1551086A1
Application number: SE1551086
Filing date: 2015-08-20
Publication date: 2017-02-21
Inventors: Andersson Jonny; Bemler Marie; Ah-King Joseph; Larsson Christian
Applicant: Scania CV AB
Patent description:
METHOD, CONTROL UNIT AND SYSTEM FOR VRU WARNING

TECHNICAL FIELD

This document relates to a method, a control unit and a system in a vehicle. More particularly, a method, a control unit and a system are described for avoiding a potential collision between the vehicle and a Vulnerable Road User (VRU).

BACKGROUND

Non-motorised road users, such as e.g. pedestrians and cyclists, as well as motorcyclists and persons with disabilities and/or reduced mobility and orientation, are sometimes referred to as Vulnerable Road Users (VRUs). This heterogeneous group is disproportionately represented in statistics on injuries and road traffic casualties.

A particularly dangerous scenario is when VRUs are situated in the vehicle driver's blind spot while the vehicle is turning at low speed.

In addition, pedestrians sometimes try to cross the street on a road section without being aware of how difficult it is for the driver to see them, assuming that the vehicle driver will let the pedestrian pass (which assumption may become lethal in case the driver does not see the pedestrian).

Another similar problem may appear when driving in city traffic, when a bicycle approaches a vehicle from behind on the inside while the vehicle is turning right. The bicyclist may then not be able to see the turning indicators of the vehicle, while the vehicle driver may not be able to see the bicyclist, which may result in a serious accident.

The above described scenarios may be particularly severe when the vehicle is a large, sight-blocking vehicle such as e.g. a bus, a truck or similar, but also a private car may block the sight of an undersized pedestrian, such as e.g. a child, a wheelchair user or a pet.

No advanced warning systems for VRUs in a vehicle's blind zone are yet known. Simple systems exist on the market today, which are based on ultrasonic sensors that identify the presence of "anything" next to the vehicle when turning or when using turn indicators.
Environment sensors according to previously known VRU warning systems will detect a large number of objects in a city environment, both harmless objects such as lamp posts, traffic signs, parked bicycles, etc., and VRUs. However, they are not capable of distinguishing between harmless immobile objects and VRUs which are only temporarily immobile. In order to create a trustworthy and robust VRU warning system, it is important that the system warns only for dangerous situations involving VRUs, without generating false warnings for irrelevant situations.

Furthermore, it is important to predict when a driver/vehicle is about to take a sharp turn, before it happens, in order to build a reliable VRU warning function in a vehicle. A path prediction that is too restrictive will most likely ignore or delay warnings in some dangerous situations, while a too generous path prediction is likely to give lots of "false" warnings as soon as someone is walking near the vehicle, such as e.g. on a sidewalk separated from the road.

Thus it would be desirable to develop an improved VRU warning system.

SUMMARY

It is therefore an object of this invention to solve at least some of the above problems and improve traffic safety.

According to a first aspect of the invention, this objective is achieved by a method in a vehicle for avoiding a potential collision between the vehicle and a Vulnerable Road User (VRU). The method comprises predicting a future path of the vehicle. The method further comprises detecting the VRU and the position of the VRU. In addition, the method comprises determining the velocity of the detected VRU. The method furthermore comprises predicting a future position of the detected VRU, based on the VRU position upon detection and the determined VRU velocity. The method also comprises performing an action for avoiding a collision when the predicted future position of the VRU overlaps the predicted future path of the vehicle.
According to a second aspect of the invention, this objective is achieved by a control unit in a vehicle. The control unit is configured for avoiding a potential collision between the vehicle and a VRU. The control unit is configured for predicting a future path of the vehicle. Furthermore, the control unit is configured for detecting the VRU and the position of the VRU via a sensor. The control unit is further configured for determining the velocity of the detected VRU. The control unit is additionally configured for predicting a future position of the detected VRU, based on the position of the detected VRU and the determined VRU velocity. The control unit is also configured for performing an action for avoiding a collision when the predicted future position of the VRU overlaps the predicted future path of the vehicle.

According to a third aspect of the invention, this objective is achieved by a computer program comprising program code for performing a method according to the first aspect when the computer program is executed in a control unit according to the second aspect.

According to a fourth aspect, this objective is achieved by a system for avoiding a potential collision between the vehicle and a VRU. The system comprises a control unit according to the second aspect. Further, the system comprises a sensor on the vehicle, configured for detecting the VRU and the position of the VRU. The system in addition comprises a warning emitting device on the vehicle, configured for emitting a warning for avoiding a collision.

Thanks to the described aspects, a reliable VRU warning and collision avoidance system is achieved, based on an accurate path prediction of the vehicle and a reliable VRU detection and prediction of the VRU's future path. Thereby a warning system is achieved that warns/intervenes only when a collision with a VRU is really probable, i.e. when the predicted path of the vehicle and the predicted path of the VRU overlap.
Such a system will gain high acceptance and trust as superfluous warnings are eliminated or at least reduced, which in turn is expected to reduce fatalities from turn accidents. Thus increased traffic safety is achieved.

Other advantages and additional novel features will become apparent from the subsequent detailed description.

FIGURES

Embodiments of the invention will now be described in further detail with reference to the accompanying figures, in which:

Figure 1 illustrates a vehicle according to an embodiment of the invention;
Figure 2 illustrates an example of a traffic scenario and an embodiment of the invention;
Figure 3 illustrates an example of a vehicle interior according to an embodiment;
Figure 4A illustrates an example of a traffic scenario and an embodiment of the invention;
Figure 4B illustrates an example of a traffic scenario and an embodiment of the invention;
Figure 5 illustrates an example of a vehicle interior according to an embodiment;
Figure 6 is a flow chart illustrating an embodiment of the method; and
Figure 7 is an illustration depicting a system according to an embodiment.

DETAILED DESCRIPTION

Embodiments of the invention described herein are defined as a method, a control unit and a system, which may be put into practice in the embodiments described below. These embodiments may, however, be exemplified and realised in many different forms and are not to be limited to the examples set forth herein; rather, these illustrative examples of embodiments are provided so that this disclosure will be thorough and complete.

Still other objects and features may become apparent from the following detailed description, considered in conjunction with the accompanying drawings. It is to be understood, however, that the drawings are designed solely for purposes of illustration and not as a definition of the limits of the herein disclosed embodiments, for which reference is to be made to the appended claims.
Further, the drawings are not necessarily drawn to scale and, unless otherwise indicated, they are merely intended to conceptually illustrate the structures and procedures described herein.

Figure 1 illustrates a scenario with a vehicle 100. The vehicle 100 is driving on a road in a driving direction 105.

The vehicle 100 may comprise e.g. a truck, a bus or a car, or any similar vehicle or other means of conveyance.

Further, the herein described vehicle 100 may be a driver-controlled or a driverless, autonomously controlled vehicle 100 in some embodiments. However, for enhanced clarity, it is subsequently described as having a driver.

The vehicle 100 comprises a camera 110 and a sensor 120. In the illustrated embodiment, which is merely an arbitrary example, the camera 110 may be situated e.g. at the front of the vehicle 100, behind the windscreen of the vehicle 100. An advantage of placing the camera 110 behind the windscreen is that the camera 110 is protected from dirt, snow, rain and to some extent also from damage, vandalism and/or theft.

The camera 110 may be directed towards the front of the vehicle 100, in the driving direction 105. Thereby, the camera 110 may detect a VRU in the driving direction 105 ahead of the vehicle 100. The camera may comprise e.g. a camera, a stereo camera, an infrared camera, a video camera, an image sensor, a thermal camera and/or a time-of-flight camera in different embodiments.

Mounting the camera 110 behind the windshield (looking forward) has some advantages compared to externally mounted camera systems. These advantages include the possibility to use the windshield wipers for cleaning and to use the light from the headlights to illuminate objects in the camera's field of view. Such a multi-function camera 110 can also be used for a variety of other tasks, such as detecting objects in front of the vehicle 100, assisting in estimating the distance to an object in front of the vehicle 100, etc.
The sensor 120 may be situated at the side of the vehicle 100, arranged to detect objects at the side of the vehicle 100. The sensor 120 may comprise e.g. a radar, a lidar, an ultrasound device, a time-of-flight camera and/or similar in different embodiments.

In some embodiments, the sensor 120 may comprise e.g. a motion detector and/or be based on a Passive Infrared (PIR) sensor sensitive to a person's skin temperature through emitted black body radiation at mid-infrared wavelengths, in contrast to background objects at room temperature; or emit a continuous wave of microwave radiation and detect motion through the principle of Doppler radar; or emit an ultrasonic wave and detect and analyse the reflections; or comprise a tomographic motion detection system based on detection of radio wave disturbances, to mention some possible implementations.

By using at least one camera 110 and at least one sensor 120, the advantages of the respective types of device may be combined. The advantage of the camera 110 is that it is enabled to distinguish between e.g. a VRU and another object, also when the VRU is stationary. The advantages of the sensor 120 are the detection range, price, robustness and ability to operate in all weather conditions. Thereby high-confidence detections and classifications may be achieved. Thanks to the combination of the camera 110, which may detect the VRU also when it is stationary, and the sensor 120, which may track any VRU detected by the camera 110, a high-performance VRU warning/intervention function is achieved, possibly without adding any side-viewing camera to the vehicle 100. Thereby the need for dedicated side-viewing VRU detection sensors may be eliminated.

By having overlapping fields of view for a side-looking sensor 120 and the camera 110, stationary VRUs can first be detected with the camera 110 when passing them and then "tracked" with the sensor 120 outside the field of view of the camera 110.
This allows for VRU warning/intervention on stationary objects even outside the field of view of the camera 110, which is required for VRU warning in the driver's blind spot.

However, the side-looking sensor 120 and the camera 110 do not necessarily require overlapping fields of view; they may as well have fields of view adjacent to each other, or with a gap in between. A calculation may in the latter case be made for mapping an object detected by the camera 110 with the same object detected by the side-looking sensor 120, in some embodiments.

Figure 2 schematically illustrates a scenario, similar to the previously discussed scenario illustrated in Figure 1, but seen from above and wherein a predicted future path of the vehicle 100 is depicted.

A possible path of the vehicle 100 is predicted by using available information. The path prediction comprises determining the steering wheel angle and steering wheel rate, and possibly also determining whether the direction indicators are activated. Further, in some embodiments, the path prediction may also use a camera system that can detect the road surface or natural borders of the road, such as elevated sidewalks etc., to improve the path prediction. If high-resolution map data is available, similar effects can be gained by increasing the probability of a turn near an intersection.

The prediction is based on formula [1] for calculating the steady-state relationship between steering wheel angle and yaw rate of the vehicle 100:

α_sw · v = n · (L + K_us · v²) · ω,   [1]

where ω = yaw rate (rad/s); α_sw = steering wheel angle (rad); v = vehicle speed; n = steering gear ratio; L = effective wheel base (distance from front axle to effective rotation centre); and K_us = understeer gradient (s²/m).

At low speeds (which are normally relevant for VRU warning systems), the term K_us · v² may be neglected for simplification, leading to:

α_sw · v = n · L · ω.   [2]
Assuming that α_sw, α̇_sw (steering wheel angle rate) and the direction indicator signals can be measured, the possible path can be calculated as:

α_sw(t) = α_sw(0) + ∫₀ᵗ α̇_sw(τ) dτ = α_sw(0) + α̇_sw(0) · t + ½ · α̈_sw · t²,   [3]

where the steering wheel acceleration, α̈_sw, is assumed to be constant during the turn. The specific value of α̈_sw may be set depending on the ego vehicle speed and/or on whether the turn indicator (for this side) is on, according to some embodiments.

Using equations [2] and [3], the yaw rate ω for each relevant time step is calculated. Certain limits on steering wheel angle and/or steering wheel rate can also be applied to limit the path prediction when the driver quickly steers to one side. For example, for some vehicle types it might be reasonable to assume that a turn is never more than 90 degrees within a given time frame. For other vehicles, such as a truck with a trailer, it might be necessary to steer more to negotiate certain turns. Furthermore, buses with large overhang take wide curves to negotiate turns, which may also be taken into account in the predictions in some embodiments.

In some embodiments, the vehicle 100 comprises a camera system. The camera system may be able to detect the road surface or natural borders of the road, such as elevated sidewalks etc. Thereby the path prediction may be improved, for example by limiting the path by assuming that the own vehicle 100 stays on the road, or by lowering or limiting the value of α̈_sw when the vehicle 100 is close to the road border. Thereby the number of false warnings for VRUs, such as pedestrians/bicyclists that reside close to the own vehicle 100 but on an elevated sidewalk, may be avoided or at least reduced.

In the illustrated arbitrary example, the vehicle 100 is driving straight forward on the road in a first time frame t0, i.e. the yaw rate ω is zero.
By measuring the velocity v of the vehicle 100, the steering wheel angle α_sw and the steering wheel angle rate α̇_sw, and by using equations [2] and [3], the yaw rate ω1 for the first time frame t1 is calculated. By iterating the calculations of equations [2] and [3], based on the predicted position in time frame t1, the yaw rates ω2, ω3 and the vehicle positions in time frames t2 and t3 may be predicted. It may thereby be predicted that the vehicle 100 is turning to the right, in this example.

An accurate path prediction is the backbone for creating a reliable VRU warning system that only warns/intervenes when a collision with a VRU is really probable and impending. Such a system will gain higher acceptance and trust, which in turn is expected to reduce fatalities from turn accidents.

However, the disclosed method for path prediction of the vehicle 100 is not limited to VRU warning systems, but may be used for various other purposes.

Furthermore, a VRU 200 is detected by the camera 110 and/or the sensor 120 in the vehicle 100. The VRU 200 is moving in a walking direction 205. The position 210 of the VRU 200 in some future time frames is predicted, based on a velocity estimation of the VRU 200.

Depending on the type of sensor 120, a classification of relevant objects may be done. When the sensor 120 comprises a radar or lidar, objects that have been seen moving may be classified as relevant. For the camera 110, objects that are recognised as a VRU 200 may be classified as relevant. Also, objects that are stationary but have been recognised as a VRU 200 by the camera 110 can be tracked outside the camera's field of view and can hence be classified as relevant, as further discussed and explained in Figure 4A and Figure 4B.
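The iterative path prediction of equations [2] and [3] can be sketched in code. This is a minimal illustration, not the patented implementation: the function name `predict_path`, the parameter names, the fixed time step and the dead-reckoning position update are assumptions introduced here; only the low-speed relation α_sw · v = n · L · ω and the constant steering wheel acceleration come from the text above.

```python
import math

def predict_path(v, alpha_sw0, alpha_sw_rate0, alpha_sw_acc,
                 n_ratio, wheel_base, dt=0.1, t_max=3.0):
    """Iteratively predict vehicle positions from equations [2] and [3].

    v              -- vehicle speed (m/s), assumed constant over the horizon
    alpha_sw0      -- steering wheel angle at t=0 (rad)
    alpha_sw_rate0 -- steering wheel angle rate at t=0 (rad/s)
    alpha_sw_acc   -- assumed-constant steering wheel acceleration (rad/s^2)
    n_ratio        -- steering gear ratio n
    wheel_base     -- effective wheel base L (m)
    Returns a list of (x, y) positions in the vehicle's starting frame.
    """
    x = y = heading = 0.0
    t = 0.0
    path = [(x, y)]
    while t < t_max:
        # Equation [3]: extrapolate the steering wheel angle at time t.
        alpha_sw = alpha_sw0 + alpha_sw_rate0 * t + 0.5 * alpha_sw_acc * t * t
        # Equation [2], low-speed approximation: alpha_sw * v = n * L * omega.
        omega = alpha_sw * v / (n_ratio * wheel_base)
        # Dead-reckon the position one time step forward.
        heading += omega * dt
        x += v * math.cos(heading) * dt
        y += v * math.sin(heading) * dt
        path.append((x, y))
        t += dt
    return path
```

With zero steering input the sketch degenerates to a straight line along the x axis, and a constant positive steering angle bends the path to one side, matching the t1, t2, t3 positions of Figure 2.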
Depending on the probability of the VRU position 210 being in the possible path of the vehicle 100, different stages of warning/intervention can be directed to the driver of the vehicle 100 and/or the vehicle 100. Warning/intervention may only be done when a set of general conditions is fulfilled, which may comprise a limit on the ego vehicle speed, such as for example 0 < v < 30 km/h, and/or a certain angle of the steering wheel or the activation of the turn indicator to the relevant side.

For example, the following actions 1-4 may be initiated at different probabilities P in some embodiments.

(1) P > p1 (collision possible): A silent warning may be shown, for example with a diode/lamp in the vehicle 100, or by vibrating the steering wheel/driver's chair etc.

(2) P > p2 (collision with the VRU 200 probable): An audible warning sound may be emitted, alone or on top of action (1) in different embodiments.

(3) P > p3 (collision with the VRU 200 very imminent): A short brake jerk may be initiated and/or a steering wheel torque is induced to counteract the driver's turning action.

(4) P > p4 (if collision is not possible to avoid or has already happened): Automatic braking to standstill, to avoid running over the VRU 200 with any wheel etc.

Based on the ego vehicle's "possible path", a trajectory of the vehicle 100 may be calculated. Simulation of the ego vehicle's future position may be done over a number of time steps up to a maximum simulation time t_max. By measuring the position and velocity (i.e. speed and direction) of detected objects/VRUs 200, the probability distribution of the VRUs' possible future positions may be calculated using a movement model.

The probability of the object's position being in the ego vehicle's path may then be calculated, i.e. a measurement of how likely it is that the VRU 200 will be in the vehicle path.

Thereby a VRU warning system is achieved that only warns/intervenes when a collision with the VRU 200 is really probable.
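The staged escalation (1)-(4) and the general gating conditions can be sketched as a simple decision function. This is an illustrative sketch only: the threshold values p1-p4, the function names and the minimum steering angle are assumptions not given in the text, which leaves the specific probabilities open.

```python
def select_action(p, p1=0.3, p2=0.6, p3=0.8, p4=0.95):
    """Map collision probability p to one of the warning/intervention stages.

    The thresholds p1..p4 are placeholder values; the document only
    requires p1 < p2 < p3 < p4.
    """
    if p > p4:
        return "automatic_brake_to_standstill"   # stage (4)
    if p > p3:
        return "brake_jerk_and_steering_torque"  # stage (3)
    if p > p2:
        return "audible_warning"                 # stage (2)
    if p > p1:
        return "silent_warning"                  # stage (1)
    return "no_action"

def warning_conditions_fulfilled(ego_speed_kmh, turn_indicator_on,
                                 steering_angle_rad, min_angle_rad=0.2):
    """General gating: low ego speed plus a turn cue to the relevant side."""
    turn_cue = turn_indicator_on or abs(steering_angle_rad) > min_angle_rad
    return 0 < ego_speed_kmh < 30 and turn_cue
```

In use, a warning stage would only be acted upon when `warning_conditions_fulfilled(...)` holds, so that the system stays silent at highway speeds or when no turn is indicated.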
Such a VRU warning system will gain high acceptance and trust, which in turn is expected to reduce fatalities from turn accidents.

Figure 3 illustrates an example of a vehicle interior of the vehicle 100 and depicts how the previous scenario in Figure 1 and/or Figure 2 may be perceived by the driver of the vehicle 100.

The vehicle 100 comprises a control unit 310. The control unit 310 is able to obtain the measurements required to perform the calculations according to equations [2] and [3]. Further, the vehicle 100 also comprises a sensor 320 for measuring the steering wheel angle α_sw and the steering wheel angle rate α̇_sw of the steering wheel of the vehicle 100. In some embodiments, two or more sensors 320 may be utilised, such as e.g. one sensor 320 for measuring the steering wheel angle α_sw and a separate sensor 320 for measuring the steering wheel angle rate α̇_sw.

The velocity of the vehicle 100 may be measured or estimated by the speedometer in the vehicle, or by the positioning device 330.

The geographical position of the vehicle 100 may be determined by a positioning device 330, or navigator, in the vehicle 100, which may be based on a satellite navigation system such as the Navigation Signal Timing and Ranging (Navstar) Global Positioning System (GPS), Differential GPS (DGPS), Galileo, GLONASS, or the like.

The determination of the geographical position of the positioning device 330 (and thereby also of the vehicle 100) may be made continuously with a certain predetermined or configurable time interval according to various embodiments.

Positioning by satellite navigation is based on distance measurement using triangulation from a number of satellites 340-1, 340-2, 340-3, 340-4. In this example, four satellites 340-1, 340-2, 340-3, 340-4 are depicted, but this is merely an example. More than four satellites 340-1, 340-2, 340-3, 340-4 may be used for enhancing the precision, or for creating redundancy.
The satellites 340-1, 340-2, 340-3, 340-4 continuously transmit information about time and date (for example, in coded form), identity (which satellite 340-1, 340-2, 340-3, 340-4 is broadcasting), status, and where the satellite 340-1, 340-2, 340-3, 340-4 is situated at any given time. The GPS satellites 340-1, 340-2, 340-3, 340-4 send information encoded with different codes, for example, but not necessarily, based on Code Division Multiple Access (CDMA). This allows information from an individual satellite 340-1, 340-2, 340-3, 340-4 to be distinguished from the others' information, based on a unique code for each respective satellite 340-1, 340-2, 340-3, 340-4. This information can then be transmitted to be received by the appropriately adapted positioning device comprised in the vehicle 100.

Distance measurement can according to some embodiments comprise measuring the difference in the time it takes for each respective satellite signal transmitted by the respective satellites 340-1, 340-2, 340-3, 340-4 to reach the positioning device 330. As the radio signals travel at the speed of light, the distance to the respective satellite 340-1, 340-2, 340-3, 340-4 may be computed by measuring the signal propagation time.

The positions of the satellites 340-1, 340-2, 340-3, 340-4 are known, as they are continuously monitored by approximately 15-30 ground stations located mainly along and near the earth's equator. Thereby the geographical position, i.e. latitude and longitude, of the vehicle 100 may be calculated by determining the distance to at least three satellites 340-1, 340-2, 340-3, 340-4 through triangulation. For determination of altitude, signals from four satellites 340-1, 340-2, 340-3, 340-4 may be used according to some embodiments.
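The two ingredients above, distance from signal propagation time and position from at least three distances, can be sketched in simplified 2D form. This is a textbook illustration under strong assumptions (no clock bias, a flat 2D plane, exact distances); the function names and the linearisation are introduced here and do not come from the patent.

```python
C = 299_792_458.0  # speed of light (m/s)

def pseudorange(delta_t):
    """Distance implied by a measured signal propagation time (s)."""
    return C * delta_t

def trilaterate_2d(p1, d1, p2, d2, p3, d3):
    """Solve for (x, y) from three known anchor positions and distances.

    Subtracting the circle equations (x-xi)^2 + (y-yi)^2 = di^2 pairwise
    cancels the quadratic terms and leaves a 2x2 linear system.
    """
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 - x1**2 + x2**2 - y1**2 + y2**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d1**2 - d3**2 - x1**2 + x3**2 - y1**2 + y3**2
    det = a1 * b2 - a2 * b1  # zero if the three anchors are collinear
    x = (c1 * b2 - c2 * b1) / det
    y = (a1 * c2 - a2 * c1) / det
    return x, y
```

Real GNSS receivers additionally solve for the receiver clock bias, which is why a fourth satellite is needed in practice, consistent with the altitude remark above.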
Having determined the geographical position of the vehicle 100 by the positioning device 330 (or in another way), it may be presented on a map, a screen or a display device where the position of the vehicle 100 may be marked, in some optional, alternative embodiments.

In some embodiments, the current geographical position of the vehicle 100 and the computed predicted path of the vehicle 100 may be displayed on an interface unit. The interface unit may comprise a mobile telephone, a computer, a computer tablet or any similar device.

Furthermore, the vehicle 100 may comprise a camera 110 in some embodiments. The camera 110 may be situated e.g. at the front of the vehicle 100, behind the windscreen of the vehicle 100. An advantage of placing the camera 110 behind the windscreen is that the camera 110 is protected from dirt, snow, rain and to some extent also from damage, vandalism and/or theft.

The camera 110 may be directed towards the front of the vehicle 100, in the driving direction 105. Thereby, the camera 110 may detect road limitations ahead of the vehicle 100, such as an elevated sidewalk, and/or a crossroad or road junction.

Figure 4A schematically illustrates a scenario, similar to the previously discussed scenario illustrated in Figure 2, with the vehicle 100 seen from above.

When the vehicle 100 is driving in a driving direction 105, a camera 110 detects a VRU 200. An image recognition program may recognise the VRU 200 as a VRU and possibly also categorise it as e.g. a pedestrian, child, bicyclist, animal etc.

As the vehicle 100 is driving forward in the driving direction 105 and approaching the VRU 200, the VRU 200 for a moment becomes situated in an area where it is detected both by the camera 110 and a sensor 120. The VRU 200 may then be mapped with the object 200 detected by the sensor 120. Thereby it becomes possible for the sensor 120 to recognise the VRU 200 as a VRU, also when the VRU 200 is stationary.
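The mapping of a camera-classified VRU to the corresponding side-sensor detection can be sketched as a nearest-neighbour association in a common vehicle coordinate frame. This is a simplified sketch of one possible association scheme, not the patented method; the function name, the (x, y) detection format and the gating distance `max_dist` are assumptions introduced here.

```python
import math

def associate(camera_detections, sensor_detections, max_dist=1.5):
    """Map camera-classified VRUs to side-sensor detections by nearest position.

    Each detection is an (x, y) point in a common vehicle frame; max_dist
    is an assumed gating threshold in metres. Returns (camera_index,
    sensor_index) pairs; each sensor detection is used at most once.
    """
    pairs = []
    used = set()
    for ci, (cx, cy) in enumerate(camera_detections):
        best, best_d = None, max_dist
        for si, (sx, sy) in enumerate(sensor_detections):
            if si in used:
                continue
            d = math.hypot(cx - sx, cy - sy)
            if d < best_d:
                best, best_d = si, d
        if best is not None:
            used.add(best)
            pairs.append((ci, best))
    return pairs
```

Once a pair is established while the fields of view overlap (or nearly overlap), the sensor-side track inherits the camera's VRU classification and can keep it after the object leaves the camera's field of view.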
However, in other embodiments, there may be no overlap between the fields of view of the camera 110 and the sensor 120, respectively. The mapping may anyway be made, based on e.g. an estimation of the distance, direction and/or speed of the object 200; and/or the size or shape of the object 200.

As the vehicle 100 advances in the driving direction 105, the VRU 200 moves out of sight of the camera 110 while still being situated within range of the sensor 120, as illustrated in Figure 4B. The VRU 200 may then be tracked by the sensor 120 for as long as it is situated within detection range of the sensor 120.

An accurate detection and tracking of any VRU 200 in the proximity of the vehicle 100 is the backbone for creating a reliable VRU warning system that only warns/intervenes when a collision with a VRU is really probable and impending. Such a system will gain higher acceptance and trust, which in turn is expected to reduce fatalities from turn accidents.

However, the disclosed method for VRU detection is not limited to VRU warning systems, but may be used for various other purposes.

Figure 5 illustrates an example of a vehicle interior of the vehicle 100 and depicts how the previous scenario in Figure 1, Figure 2 and/or Figure 4A may be perceived by the driver of the vehicle 100.

The vehicle 100 comprises a control unit 310. The control unit 310 is able to recognise the VRU 200 as a VRU, based on one or more images provided by the camera 110. Further, the control unit 310 is configured for receiving detection signals from the sensor 120 and mapping the detected VRU with the detection signals received from the sensor 120. Also, the control unit 310 is further configured for tracking the VRU 200 via the sensor 120, as long as the VRU 200 is within range of the sensor 120.

As illustrated, the vehicle 100 may comprise one sensor 120-1 on the right side of the vehicle 100 and one sensor 120-2 on the left side in some embodiments.
However, in other embodiments, the vehicle 100 may comprise only one sensor 120, on the right side of the vehicle 100, thereby reducing the number of sensors 120 in the vehicle 100. In yet other embodiments, the vehicle 100 may comprise a plurality of sensors 120 on each side of the vehicle 100. The sensors 120 may be of the same or different types, such as e.g. radar, lidar, ultrasound, thermal camera, time-of-flight camera, etc.

In the illustrated example, the vehicle 100 comprises one camera 110 situated at the front of the vehicle 100, behind the windscreen. However, in other embodiments, the vehicle 100 may comprise a camera 110 situated at the rear part of the vehicle 100, directed in a direction opposite to the normal driving direction 105. Thus detection of VRUs 200 may be made while backing the vehicle 100. The camera 110 may in such case be situated inside the rear glass, in order to be protected from dirt, snow, etc.

The control unit 310 may communicate with the camera 110 and the sensor 120, e.g. via a communication bus of the vehicle 100, or via a wired or wireless connection.

The control unit 310 calculates and predicts a future path t1, t2, t3 of the vehicle 100 in a number of time frames t1, t2, t3. The control unit 310 also detects the VRU 200 via the camera 110 and the sensor 120-1. Based on the movement direction 205 and the velocity of the VRU 200, if any, a future position 210 of the VRU 200 in the future time frames t1, t2, t3 is predicted.

In case the predicted future path t1, t2, t3 of the vehicle 100 intersects the future position 210 of the VRU 200 in an overlap 220, an action may be performed.

Such an action may comprise emitting a warning to the driver, emitting a warning to the VRU 200 and/or initiating an automatic action for avoiding a collision with the VRU 200 by automatic braking and/or an automatic evasive action.

The type of action may be dependent on the magnitude of the probability of a collision.
Such probability may be proportional to the size of the overlap 220 in some embodiments.

The probability of a collision may also be dependent on a categorisation of the VRU 200. To mention some examples, a child or an animal, in particular a game animal, may increase the probability of a collision, as children and wild animals typically may behave in an unpredictable and stochastic manner. Some VRUs 200 may on the other hand be expected to behave in a rather predictable way in a traffic situation, for example motorcyclists, who could be expected to be adults and to be aware of the risks of erratic or non-predictable behaviour in road traffic. A motorcyclist typically holds a driver's licence and is thus aware of traffic rules, can read traffic signs etc.

In the illustrated example, an alert is emitted in order to warn the driver and make him/her aware of the VRU 200. In this case an audible warning is presented to the driver from a warning emitting device 510. However, in other embodiments, such a warning may comprise a visual warning on a display, on the dashboard of the vehicle 100, on a head-up display, by projecting a visual warning on the windscreen or on the road in front of the vehicle 100, or by a device adapted for Augmented Reality (AR). Such an AR device may comprise the windscreen of the vehicle 100, glasses of the driver, lenses of the driver, etc.

Further, a haptic signal or tactile feedback may be provided in the steering wheel, driver seat or similar, for providing a silent alert to the driver.

In some embodiments, a warning may be provided to the VRU 200, e.g. by flashing the vehicle headlights, which may be particularly effective when driving in dark or obscure light conditions, such as at night time, in twilight, in fog, or when the sun is concealed by clouds.
As previously mentioned, a plurality of warning emitting devices 510 of the vehicle 100 may be activated simultaneously for warning the driver and/or the VRU 200, and possibly also other vehicles or road users in the vicinity.

In some embodiments, a warning may be emitted by flashing the vehicle headlights at night time, and by activating the horn of the vehicle 100 in daytime. Thereby, the VRU 200 as well as the driver of the vehicle 100 may be notified of the danger in an effective way, while the disturbance of the warning for other road users or people living close by is reduced.

Figure 6 is a flow chart illustrating an embodiment of a method 600 in a vehicle 100. The method 600 aims at avoiding a potential collision between the vehicle 100 and a VRU 200.

The vehicle 100 may be e.g. a truck, a bus, a car, a motorcycle or similar.

In order to correctly be able to avoid the potential collision between the vehicle 100 and the VRU 200, the method 600 may comprise a number of steps 601-608. However, some of these steps 601-608 may be performed solely in some alternative embodiments, like e.g. step 605, step 606 and/or step 607. Further, the described steps 601-608 may be performed in a somewhat different chronological order than the numbering suggests. The method 600 may comprise the subsequent steps:

Step 601 comprises predicting a future path t1, t2, t3 of the vehicle 100.

In some embodiments, the predicted future path t1, t2, t3 of the vehicle 100 may correspond to a first area t1, t2, t3 occupied by the vehicle 100 during a set of future time frames.

Further, the future path t1, t2, t3 of the vehicle 100 may be predicted by measuring the velocity of the vehicle 100. Further, the prediction may comprise measuring the steering wheel angle α_sw and measuring the steering wheel angle rate α̇_sw. Furthermore, the prediction may comprise calculating a future steering wheel angle α_sw, based on the measured steering wheel angle α_sw and the measured steering wheel angle rate α̇_sw.
Further, the prediction may comprise calculating a future yaw rate ω of the vehicle 100 based on the measured velocity of the vehicle 100 and the calculated future steering wheel angle αsw. The prediction may furthermore comprise extrapolating a vehicle position of the vehicle 100 in a set of future time frames, based on the calculated future yaw rate ω and the vehicle velocity. Also, the prediction of the future path t1, t2, t3 of the vehicle 100 may be based on the extrapolated vehicle positions in the set of future time frames.

The extrapolation of the vehicle position of the vehicle 100 may comprise iteration of the steps of calculating the future steering wheel angle αsw and calculating a future yaw rate ω of the vehicle 100.

Furthermore, the steering wheel acceleration α''sw may be assumed to be constant during the set of future time frames and set based on the measured velocity of the vehicle and the turn indicator status, in some embodiments.

The prediction of the vehicle path may further be based on road border detection made by a camera 110 in the vehicle 100.

The prediction of the future vehicle path may further be based on a destination of the vehicle 100, extracted from a navigator 330 of the vehicle 100, in some embodiments.

In some embodiments, the calculation of the future steering wheel angle αsw at a time t may be made by:

αsw(t) = αsw(t0) + ∫[t0→t] α'sw(τ) dτ, where α'sw(τ) = α'sw(t0) + α''sw · (τ − t0).

Step 602 comprises detecting the VRU 200 and the position of the VRU 200.

The detection of the VRU 200 and the position of the VRU 200 may in some embodiments comprise detecting an object 200 by a camera 110 of the vehicle 100 and classifying the detected object 200 as a VRU 200. Furthermore, the detection of the VRU 200 may comprise detecting the object 200 by a sensor 120 of the vehicle 100. In addition, the detection may also comprise mapping the classified VRU 200 with the object 200 detected by the sensor 120.
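The prediction chain of step 601 (future steering wheel angle → future yaw rate → extrapolated positions) can be sketched as follows. This is a simplified illustration assuming a kinematic bicycle model, a fixed steering ratio, and constant steering wheel acceleration; the parameter names and default values are assumptions, not specified by the patent:

```python
import math

def predict_path(v, alpha0, alpha_rate0, alpha_acc,
                 wheelbase=3.5, steering_ratio=16.0, dt=0.1, horizon=3.0):
    """Extrapolate vehicle positions over a set of future time frames.
    v: vehicle speed [m/s]; alpha0: measured steering wheel angle [rad];
    alpha_rate0: measured steering wheel angle rate [rad/s];
    alpha_acc: assumed-constant steering wheel acceleration [rad/s^2]."""
    x = y = heading = 0.0
    path = []
    steps = int(horizon / dt)
    for i in range(1, steps + 1):
        t = i * dt
        # future steering wheel angle from angle, rate and constant acceleration
        alpha = alpha0 + alpha_rate0 * t + 0.5 * alpha_acc * t * t
        # future yaw rate from speed and road-wheel angle (bicycle model)
        yaw_rate = v * math.tan(alpha / steering_ratio) / wheelbase
        # extrapolate the position over one time frame
        heading += yaw_rate * dt
        x += v * math.cos(heading) * dt
        y += v * math.sin(heading) * dt
        path.append((t, x, y))
    return path
```

With zero steering input the extrapolated path is a straight line; a non-zero steering wheel angle curves the predicted positions to one side, which is what lets the predicted first area follow a turn.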
Also, the detection of the VRU 200 and the position of the VRU 200 may in addition comprise tracking the VRU 200 by the sensor 120.

The camera 110 may comprise e.g. a camera, a stereo camera, an infrared camera, a video camera, or a time-of-flight camera. The sensor 120 may comprise e.g. a radar, a lidar, an ultrasound device, a time-of-flight camera, and/or similar, in different embodiments.

The classification of the detected object 200 may be made based on image recognition in some embodiments, by an image recognition program.

Further, the classification may comprise a movement prediction reliability estimation of the VRU 200, wherein unattended animals and people shorter than a configurable threshold length are classified as having reduced movement prediction reliability.

Such classification may further comprise a movement prediction reliability estimation of the VRU 200, wherein motorcycle drivers may be classified as having enhanced movement prediction reliability in some embodiments.

Step 603 comprises determining the velocity of the detected 602 VRU 200.

Determining the velocity of the detected 602 VRU 200 may comprise determining the speed and movement direction 205 of the VRU 200. The velocity may be determined by analysing a sequence of images of the VRU 200 during a number of time frames.

Step 604 comprises predicting a future position 210 of the detected 602 VRU 200, based on the VRU position upon detection 602 and the determined 603 VRU velocity.

The predicted future position 210 of the VRU 200 may comprise a second area 210 wherein the VRU 200 is expected to be situated at the set of future time frames in some embodiments.

Furthermore, in some embodiments, a probability of a collision occurring may be estimated, proportional to an overlap 220 between the first area t1, t2, t3 and the second area 210.
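The overlap-proportional collision probability of steps 601-604, combined with the movement prediction reliability classes described above, can be sketched as below. The areas are simplified to axis-aligned rectangles and the class names and scale factors are illustrative assumptions:

```python
def rect_area(r):
    """Area of an axis-aligned rectangle given as (x1, y1, x2, y2)."""
    return max(0.0, r[2] - r[0]) * max(0.0, r[3] - r[1])

def rect_overlap(a, b):
    """Overlap area of two axis-aligned rectangles."""
    w = min(a[2], b[2]) - max(a[0], b[0])
    h = min(a[3], b[3]) - max(a[1], b[1])
    return max(0.0, w) * max(0.0, h)

# Assumed illustrative factors: reduced-reliability classes (child,
# unattended animal) raise the probability; an enhanced-reliability
# class (motorcyclist) lowers it.
RELIABILITY_SCALE = {"child": 1.5, "unattended_animal": 1.5, "motorcyclist": 0.7}

def collision_probability(first_area, second_area, vru_class="adult"):
    """Probability proportional to the overlap between the vehicle's
    first area and the VRU's second area, scaled by the VRU class."""
    base = rect_overlap(first_area, second_area) / rect_area(second_area)
    return min(1.0, base * RELIABILITY_SCALE.get(vru_class, 1.0))
```

A full overlap of the VRU's predicted area yields probability 1.0 for the default class, while disjoint areas yield 0.0.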
The probability of a collision may furthermore be increased when the VRU 200 is detected 602 as an unattended animal, such as a game animal, or a person shorter than a configurable threshold length, such as e.g. a child.

Step 605, which may be performed only in some particular embodiments, comprises determining the geographical position of the vehicle 100.

The current vehicle position may be determined by a geographical positioning device 330, such as e.g. a GPS. However, the current position of the vehicle 100 may alternatively be detected and registered by the camera 110 in some embodiments, by detecting e.g. a pedestrian crossing or similar.

Step 606, which may be performed only in some particular embodiments wherein the geographical position of the vehicle 100 has been determined 605, comprises extracting statistical information related to a probability of a collision at the determined 605 geographical position. The probability of a collision may be increased at geographical positions where the number of traffic accidents exceeds a threshold limit, or where the determined 605 geographical position is identified as a pedestrian crossing.

Such statistical information may be based on historical accidents at certain geographical positions, stored in a database, which may be kept in the vehicle 100, or external to the vehicle 100 but accessible from the vehicle 100 via a wireless communication interface. Such information may be provided e.g. by a third party provider.

However, in some embodiments, the statistical information may comprise information about certain traffic scenarios which may present an increased probability of an accident, such as for example unattended crossings, game fence endings, etc.

Step 607, which may be performed only in some particular embodiments, comprises detecting a traffic structure related to an increased probability of a collision. Such a traffic structure may comprise e.g.
a pedestrian crossing, the vicinity of a school or playground, a road crossing, wild game zones, etc. Such a traffic structure may be detected by the forward-directed camera 110 of the vehicle 100 in some embodiments. However, in some other alternative embodiments, the traffic structure having an increased probability of a collision may be detected by a sensor based on electromagnetic radiation, such as radio signals, light signals, etc.

Step 608 comprises performing an action for avoiding a collision, when the predicted 604 future position 210 of the VRU 200 is overlapping 220 the predicted 601 future path t1, t2, t3 of the vehicle 100.

In some embodiments, the action may be performed when the probability of a collision exceeds a first threshold limit.

In some embodiments, the probability of a collision may be increased at geographical positions where the number of traffic accidents exceeds a threshold limit, and the action for avoiding a collision may be performed based on the probability of a collision.

The action to be performed may comprise a silent warning visually or haptically displayed to the driver of the vehicle 100, an audible warning, a short brake jerk for alerting the driver, a full brake to standstill, or an alert for warning the VRU 200 of the collision risk, in some embodiments.

Further, the silent warning may be visually or haptically displayed to the driver of the vehicle 100 when the probability of a collision exceeds a first threshold limit. The audible warning may be emitted when the probability of a collision exceeds a second threshold limit. Further, the short brake jerk may be performed when the probability of a collision exceeds a third threshold limit. Furthermore, the full brake to standstill may be performed when the probability of a collision exceeds a fourth threshold limit, in some embodiments.

Figure 7 illustrates an embodiment of a system 700 for avoiding a potential collision between the vehicle 100 and a VRU 200.
The system 700 may perform at least some of the previously described steps 601-608 according to the method 600 described above and illustrated in Figure 6.

The system 700 comprises a control unit 310 in the vehicle 100. The control unit 310 is arranged for avoiding a potential collision between the vehicle 100 and a VRU 200. The control unit 310 is configured for predicting a future path t1, t2, t3 of the vehicle 100. Further, the control unit 310 is configured for detecting the VRU 200 and the position of the VRU 200 via a sensor 120. The control unit 310 is also configured for determining the velocity of the detected VRU 200. In further addition, the control unit 310 is also configured for predicting a future position of the detected VRU 200, based on the position of the detected VRU 200 and the determined VRU velocity. The control unit 310 is configured for performing an action for avoiding a collision, when the predicted future position 210 of the VRU 200 is overlapping 220 the predicted future path t1, t2, t3 of the vehicle 100.

The action for avoiding a collision may be dependent on the size of the overlap 220 between the predicted future position 210 of the VRU 200 and the predicted future path t1, t2, t3 of the vehicle 100 in some embodiments.

Further, the control unit 310 may be configured for predicting a future path t1, t2, t3 of the vehicle 100 which corresponds to a first area t1, t2, t3 occupied by the vehicle 100 during a set of future time frames. The predicted future position 210 of the VRU 200 may comprise a second area 210 wherein the VRU 200 may be expected to be situated at the set of future time frames. The probability of a collision occurring may be proportional to the overlap 220 between the first area t1, t2, t3 and the second area 210. Also, the control unit 310 may be configured for performing the action when the probability of a collision exceeds a first threshold limit, in some embodiments.
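The four escalating threshold limits described for step 608 can be expressed as a tiered policy, for instance as below. The numeric threshold values are illustrative assumptions; the text only requires the thresholds to be ordered from first to fourth:

```python
def choose_action(probability, thresholds=(0.25, 0.5, 0.75, 0.9)):
    """Map a collision probability to the strongest warranted action.
    thresholds = (first, second, third, fourth) limit; the values here
    are assumed for illustration."""
    t1, t2, t3, t4 = thresholds
    if probability >= t4:
        return "full_brake_to_standstill"
    if probability >= t3:
        return "short_brake_jerk"
    if probability >= t2:
        return "audible_warning"
    if probability >= t1:
        return "silent_visual_or_haptic_warning"
    return "no_action"
```

Checking the thresholds from the highest downwards ensures that only the single strongest applicable action is triggered at any probability level.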
Further, in some embodiments, an overlap 220 between an area T1 predicted to be occupied by the vehicle 100 close in time and the second area 210, predicted to be occupied by the VRU 200 close in time, may be considered more critical than an overlap 220 between an area T3 predicted to be occupied by the vehicle 100 more remote in time and the second area 210, predicted to be occupied by the VRU 200 more remote in time.

The control unit 310 may also be configured for generating control signals for performing the action by emitting a silent warning visually or haptically displayed to the driver of the vehicle 100, an audible warning, a short brake jerk for alerting the driver, a full brake to standstill, or an alert for warning the VRU 200 of the collision risk.

Further, the control unit 310 may in addition be configured for generating control signals for emitting a silent warning, visually or haptically displayed to the driver of the vehicle 100, when the probability of a collision exceeds a first threshold limit. The control unit 310 may also be configured for generating control signals for emitting an audible warning when the probability of a collision exceeds a second threshold limit. Also, the control unit 310 may be configured for generating control signals for performing a short brake jerk when the probability of a collision exceeds a third threshold limit. The control unit 310 may be configured for generating control signals for performing a full brake to standstill when the probability of a collision exceeds a fourth threshold limit.

The control unit 310 may furthermore be configured for increasing the probability of a collision when the VRU 200 is detected as an unattended animal or a person shorter than a configurable threshold length.

Furthermore, the control unit 310 may be configured for determining the geographical position of the vehicle 100 in some embodiments.
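The greater criticality of near-in-time overlaps (T1) compared with remote ones (T3), described above, can be captured by weighting each time frame's overlap before summing. The inverse-frame weighting below is an assumed, illustrative choice:

```python
def weighted_overlap_risk(overlaps):
    """overlaps: dict mapping a future time frame index (1 = soonest)
    to the overlap area at that frame. Earlier frames are weighted
    more heavily, so a T1 overlap contributes more than a T3 overlap
    of the same size."""
    return sum(area / frame for frame, area in overlaps.items())
```

Any monotonically decreasing weight over the frame index would express the same preference; the 1/frame factor merely keeps the sketch short.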
The control unit 310 may also be configured for extracting statistical information related to traffic accidents at the determined geographical position. Also, the control unit 310 may be further configured for increasing the probability of a collision at geographical positions where the number of traffic accidents exceeds a threshold limit. Further, the control unit 310 may also be configured for detecting a traffic structure related to an increased probability of a collision in some embodiments.

The control unit 310 may be configured for predicting the future path t1, t2, t3 of the vehicle 100 by measuring the velocity of the vehicle 100. Further, the control unit 310 may be configured for measuring the steering wheel angle αsw, in some embodiments. The control unit 310 may also be configured for measuring the steering wheel angle rate α'sw. Also, the control unit 310 may be configured for calculating a future steering wheel angle αsw, based on the measured steering wheel angle αsw and the measured steering wheel angle rate α'sw. Further, the control unit 310 may also be configured for calculating a future yaw rate ω of the vehicle 100 based on the measured velocity of the vehicle 100 and the calculated future steering wheel angle αsw. In further addition, the control unit 310 may also be configured for extrapolating a vehicle position of the vehicle 100 in a set of future time frames, based on the calculated future yaw rate ω and the vehicle velocity. Further, the control unit 310 may also be configured for predicting the path of the vehicle 100 based on the extrapolated vehicle positions in the set of future time frames, according to some alternative embodiments.
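The accident-statistics adjustment of steps 605-606 can be sketched as a lookup that raises the collision probability near accident-prone positions. The database format, search radius and boost value below are assumptions for illustration:

```python
def accident_boost(position, accident_db, radius=50.0,
                   count_threshold=5, boost=0.2):
    """position: (x, y) of the vehicle; accident_db: list of
    (x, y, accident_count) records. Returns a probability increment
    when the number of recorded accidents within `radius` metres
    exceeds the threshold, and 0.0 otherwise."""
    nearby = sum(count for x, y, count in accident_db
                 if (x - position[0]) ** 2 + (y - position[1]) ** 2 <= radius ** 2)
    return boost if nearby > count_threshold else 0.0
```

In an on-board implementation the database could be held locally or queried over a wireless interface, as the description notes; the sketch abstracts that away.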
In further addition, the control unit 310 may also be configured for detecting the VRU 200 and the position of the VRU 200 by: detecting an object 200 by a camera 110 of the vehicle 100; classifying the detected object 200 as a VRU 200; detecting the object 200 by a sensor 120 of the vehicle 100; mapping the classified VRU 200 with the object 200 detected by the sensor 120; and tracking the VRU 200 by the sensor 120.

The control unit 310 comprises a receiving circuit 710 configured for receiving a signal from the sensor 320, from the positioning device 330 and/or the camera 110.

Further, the control unit 310 comprises a processor 720 configured for performing at least some steps of the method 600, according to some embodiments.

Such a processor 720 may comprise one or more instances of a processing circuit, i.e. a Central Processing Unit (CPU), a processing unit, a processing circuit, an Application Specific Integrated Circuit (ASIC), a microprocessor, or other processing logic that may interpret and execute instructions. The herein utilised expression "processor" may thus represent processing circuitry comprising a plurality of processing circuits, such as, e.g., any, some or all of the ones enumerated above.

Furthermore, the control unit 310 may comprise a memory 725 in some embodiments. The optional memory 725 may comprise a physical device utilised to store data or programs, i.e., sequences of instructions, on a temporary or permanent basis. According to some embodiments, the memory 725 may comprise integrated circuits comprising silicon-based transistors. The memory 725 may comprise e.g. a memory card, a flash memory, a USB memory, a hard disc, or another similar volatile or non-volatile storage unit for storing data, such as e.g. ROM (Read-Only Memory), PROM (Programmable Read-Only Memory), EPROM (Erasable PROM), EEPROM (Electrically Erasable PROM), etc., in different embodiments.

Further, the control unit 310 may comprise a signal transmitter 730.
The signal transmitter 730 may be configured for transmitting a control signal to e.g. a display device, or a VRU warning system or warning device 510, for example.

The system 700 further comprises a sensor 120 in the vehicle 100 configured for detecting the VRU 200 and the position of the VRU 200.

In addition, the system 700 also comprises a warning emitting device 510 on the vehicle 100, configured for emitting a warning for avoiding a collision.

Further, in some alternative embodiments, the system 700 may comprise a positioning device 330 for determining the geographical position of the vehicle 100.

The system 700 may furthermore comprise a camera 110 in the vehicle 100, in some embodiments.

The system 700 may further comprise a sensor in the vehicle 100, configured for measuring the steering wheel angle αsw and the steering wheel angle rate α'sw of the steering wheel of the vehicle 100. The sensor may comprise e.g. a camera, a stereo camera, an infrared camera, a video camera or similar.

The above described steps 601-608 to be performed in the vehicle 100 may be implemented through the one or more processors 720 within the control unit 310, together with a computer program product for performing at least some of the functions of the steps 601-608. Thus a computer program product, comprising instructions for performing the steps 601-608 in the control unit 310, may perform the method 600 comprising at least some of the steps 601-608 for predicting a path of the vehicle 100, when the computer program is loaded into the one or more processors 720 of the control unit 310.

Further, some embodiments may comprise a vehicle 100, comprising the control unit 310, configured for avoiding a potential collision between the vehicle 100 and a VRU 200, according to at least some of the steps 601-608.
The computer program product mentioned above may be provided for instance in the form of a data carrier carrying computer program code for performing at least some of the steps 601-608 according to some embodiments when being loaded into the one or more processors 720 of the control unit 310. The data carrier may be, e.g., a hard disk, a CD ROM disc, a memory stick, an optical storage device, a magnetic storage device or any other appropriate medium, such as a disk or tape, that may hold machine readable data in a non-transitory manner. The computer program product may furthermore be provided as computer program code on a server and downloaded to the control unit 310 remotely, e.g., over an Internet or an intranet connection.

The terminology used in the description of the embodiments as illustrated in the accompanying drawings is not intended to be limiting of the described method 600, the control unit 310, the computer program, the system 700 and/or the vehicle 100. Various changes, substitutions and/or alterations may be made without departing from invention embodiments as defined by the appended claims.

As used herein, the term "and/or" comprises any and all combinations of one or more of the associated listed items. The term "or" as used herein is to be interpreted as a mathematical OR, i.e., as an inclusive disjunction; not as a mathematical exclusive OR (XOR), unless expressly stated otherwise. In addition, the singular forms "a", "an" and "the" are to be interpreted as "at least one", thus also possibly comprising a plurality of entities of the same kind, unless expressly stated otherwise. It will be further understood that the terms "includes", "comprises", "including" and/or "comprising" specify the presence of stated features, actions, integers, steps, operations, elements and/or components, but do not preclude the presence or addition of one or more other features, actions, integers, steps, operations, elements, components and/or groups thereof.
A single unit such as e.g. a processor may fulfil the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage. A computer program may be stored/distributed on a suitable medium, such as an optical storage medium or a solid-state medium supplied together with or as part of other hardware, but may also be distributed in other forms, such as via the Internet or another wired or wireless communication system.
Claims (12)

[1] 1. A method (600) in a vehicle (100), for avoiding a potential collision between the vehicle (100) and a Vulnerable Road User, VRU (200), wherein the method (600) comprises: predicting (601) a future path (t1, t2, t3) of the vehicle (100); detecting (602) the VRU (200) and the position of the VRU (200); determining (603) velocity of the detected (602) VRU (200); predicting (604) a future position (210) of the detected (602) VRU (200), based on the VRU position upon detection (602) and the determined (603) VRU velocity; and performing (608) an action for avoiding a collision, when the predicted (604) future position (210) of the VRU (200) is overlapping (220) the predicted (601) future path (t1, t2, t3) of the vehicle (100).

[2] 2. The method (600) according to claim 1, wherein the predicted (601) future path (t1, t2, t3) of the vehicle (100) corresponds to a first area (t1, t2, t3) occupied by the vehicle (100) during a set of future time frames; and wherein the predicted (604) future position (210) of the VRU (200) comprises a second area (210) wherein the VRU (200) is expected to be situated at the set of future time frames; and wherein the probability of a collision occurring is proportional to the overlap (220) between the first area (t1, t2, t3) and the second area (210); and wherein the action is performed (608) when the probability of a collision exceeds a first threshold limit.

[3] 3. The method (600) according to any of claim 1 or claim 2, wherein the action to be performed (608) comprises a silent warning visually or haptically displayed to the driver of the vehicle (100), an audible warning, a short brake jerk for alerting the driver, a full brake to standstill or an alert for warning the VRU (200) of the collision risk.

[4] 4.
The method (600) according to any of claim 2 or claim 3, wherein the silent warning is visually or haptically displayed to the driver of the vehicle (100) when the probability of a collision exceeds a first threshold limit; the audible warning is emitted when the probability of a collision exceeds a second threshold limit; the short brake jerk is performed when the probability of a collision exceeds a third threshold limit; and the full brake to standstill is performed when the probability of a collision exceeds a fourth threshold limit.

[5] 5. The method (600) according to any of claims 1-4, wherein the probability of a collision is increased when the VRU (200) is detected (602) and classified as an unattended animal or a person shorter than a configurable threshold length.

[6] 6. The method (600) according to any of claims 1-5, further comprising: determining (605) geographical position of the vehicle (100); extracting (606) statistical information related to a probability of a collision at the determined (605) geographical position; and wherein the probability of a collision is increased at geographical positions where a number of traffic accidents exceeds a threshold limit; and wherein the action for avoiding a collision is performed (608) based on the probability of a collision.

[7] 7. The method (600) according to any of claims 1-6, further comprising: detecting (607) a traffic structure related to an increased probability of a collision; and wherein the action for avoiding a collision is performed (608) based on the probability of a collision.

[8] 8.
The method (600) according to any of claims 1-7, wherein the future path (t1, t2, t3) of the vehicle (100) is predicted (601) by: measuring velocity of the vehicle (100); measuring steering wheel angle (αsw); measuring steering wheel angle rate (α'sw); calculating a future steering wheel angle (αsw), based on the measured steering wheel angle (αsw) and the measured steering wheel angle rate (α'sw); calculating a future yaw rate (ω) of the vehicle (100) based on the measured velocity of the vehicle (100) and the calculated future steering wheel angle (αsw); extrapolating a vehicle position of the vehicle (100) in a set of future time frames, based on the calculated future yaw rate (ω) and the vehicle velocity; and predicting the path of the vehicle (100) based on the extrapolated vehicle positions in the set of future time frames.

[9] 9. The method (600) according to any of claims 1-8, wherein the detection (602) of the VRU (200) and the position of the VRU (200) comprises: detecting an object (200) by a camera (110) of the vehicle (100); classifying the detected object (200) as a VRU (200); detecting the object (200) by a sensor (120) of the vehicle (100); mapping the classified VRU (200) with the object (200) detected by the sensor (120); and tracking the VRU (200) by the sensor (120).

[10] 10.
A control unit (310) in a vehicle (100), for avoiding a potential collision between the vehicle (100) and a Vulnerable Road User, VRU (200), wherein the control unit (310) is configured for: predicting a future path (t1, t2, t3) of the vehicle (100); detecting the VRU (200) and the position of the VRU (200) via a sensor (120); determining velocity of the detected VRU (200); predicting a future position of the detected VRU (200) based on the position of the detected VRU (200) and the determined VRU velocity; and performing an action for avoiding a collision, when the predicted future position (210) of the VRU (200) is overlapping (220) the predicted future path (t1, t2, t3) of the vehicle (100).

[11] 11. A computer program comprising program code for performing a method (600) according to any of claims 1-9 when the computer program is executed in a processor in a control unit (310), according to claim 10.

[12] 12. A system (700) for avoiding a potential collision between the vehicle (100) and a Vulnerable Road User, VRU (200), wherein the system (700) comprises: a control unit (310) in the vehicle (100), according to claim 10; a sensor (120) on the vehicle (100), configured for detecting the VRU (200) and the position of the VRU (200); and a warning emitting device (510) on the vehicle (100), configured for emitting a warning for avoiding a collision.
Family patents:

Publication number | Publication date
US20180233048A1 | 2018-08-16
KR102072188B1 | 2020-01-31
EP3338266A1 | 2018-06-27
EP3338266A4 | 2019-04-24
SE539097C2 | 2017-04-11
WO2017030493A1 | 2017-02-23
BR112018001990A2 | 2018-09-18
KR20180039700A | 2018-04-18
US11056002B2 | 2021-07-06